(working title)
Esben Lykke, PhD student
21 March 2023
It was likely a dead end from the get-go :(
Basic Features
ACC-derived features¹
Sensor-independent features²
Forger, Jewett, and Kronauer (1999): a so-called cubic van der Pol equation
\[\frac{dx_c}{dt}=\frac{\pi}{12}\left\{\mu\left(x_c-\frac{4x_c^3}{3}\right)-x\left[\left(\frac{24}{0.99669\,\tau_x}\right)^2+kB\right]\right\}\]
The model depends on ambient light and core body temperature!
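The oscillator above can be integrated numerically. A minimal Euler sketch, assuming the commonly cited parameter defaults from the 1999 paper (μ = 0.13, τ_x = 24.2, k = 0.55) and no light drive (B = 0, the free-running case) — an illustration, not the study's implementation:

```python
import numpy as np

def simulate_clock(hours=72.0, dt=0.01, mu=0.13, tau_x=24.2, k=0.55, B=0.0):
    """Euler integration of the cubic van der Pol clock model
    (Forger, Jewett & Kronauer, 1999). B is the light-based drive;
    B = 0 gives the free-running oscillator. Parameter values are
    assumed defaults, used illustratively."""
    n = int(hours / dt)
    x, xc = 1.0, 0.0  # arbitrary initial state
    trace = np.empty(n)
    for i in range(n):
        dx = (np.pi / 12) * (xc + B)
        dxc = (np.pi / 12) * (
            mu * (xc - 4 * xc**3 / 3)
            - x * ((24 / (0.99669 * tau_x))**2 + k * B)
        )
        x += dt * dx
        xc += dt * dxc
        trace[i] = x
    return trace
```

With these values the state variable x settles onto a limit cycle with a period close to 24 hours.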
Walch et al. (2019) incorporated this feature using step counts from the Apple Watch
But as demonstrated by Walch et al. (2019), a simple cosine function does the trick just as well :)
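A cosine clock proxy in the spirit of Walch et al. (2019) is a one-liner per epoch. A sketch; the 24-h period, the peak hour, and the 30-second epoch length are all illustrative assumptions:

```python
import numpy as np

def cosine_clock_proxy(t_hours, period=24.0, peak_hour=3.0):
    """Cosine circadian proxy: peaks at `peak_hour` and repeats every
    `period` hours. `peak_hour` is a placeholder; a real pipeline
    would fit or fix the phase per participant."""
    return np.cos(2 * np.pi * (np.asarray(t_hours) - peak_hour) / period)

# one feature value per 30-second epoch across 24 hours (assumed epoch length)
epochs = np.arange(2880) / 120  # 2880 epochs of 30 s
proxy = cosine_clock_proxy(epochs)
```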
**Performance of the multiclass classifiers**

| Metric | Decision Tree | Decision Tree SMOTE | Logistic Regression | Neural Network | XGBoost |
|---|---|---|---|---|---|
| F1 Score | 87.95% | 87.00% | 79.40% | 91.97% | 88.20% |
| Accuracy | 89.49% | 86.06% | 78.22% | 88.99% | 89.47% |
| Sensitivity | 89.49% | 86.06% | 78.22% | 88.99% | 89.47% |
| Precision | 88.19% | 88.16% | 80.38% | 89.30% | 87.99% |
| Specificity | 91.55% | 93.66% | 70.58% | 90.66% | 91.39% |

*Weighted macro average: metrics are computed per class and averaged, weighted by the number of samples in each class.*
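The weighted macro average described in the footnote is straightforward to sketch. The per-class scores and supports below are made-up numbers for illustration, not the study's:

```python
def weighted_macro(scores, supports):
    """Average per-class scores, weighting each class by its
    number of samples (its support)."""
    total = sum(supports)
    return sum(s * n for s, n in zip(scores, supports)) / total

# hypothetical per-class F1 and sample counts for three sleep states
per_class_f1 = [0.92, 0.25, 0.95]
supports = [5200, 400, 6400]
overall = weighted_macro(per_class_f1, supports)
```

Note how a rare class (here the one with F1 = 0.25) barely moves the weighted average — worth keeping in mind when reading the tables.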
**Performance of the two binary classifiers**

| Metric | Decision Tree | Logistic Regression | Neural Network | XGBoost |
|---|---|---|---|---|
| **In-bed prediction** | | | | |
| F1 Score | 92.35% | 88.77% | 92.39% | 92.41% |
| Accuracy | 93.79% | 91.48% | 93.86% | 93.82% |
| Sensitivity | 91.85% | 82.42% | 91.23% | 92.17% |
| Precision | 92.87% | 96.19% | 93.58% | 92.66% |
| Specificity | 95.13% | 97.74% | 95.68% | 94.96% |
| **Sleep prediction** | | | | |
| F1 Score | 87.74% | 84.56% | 88.12% | 88.39% |
| Accuracy | 91.11% | 89.78% | 91.51% | 91.67% |
| Sensitivity | 91.98% | 80.90% | 90.98% | 91.71% |
| Precision | 83.87% | 88.57% | 85.43% | 85.30% |
| Specificity | 90.64% | 94.48% | 91.80% | 91.65% |
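For the binary tasks, all five reported metrics fall out of a single confusion matrix. A minimal sketch (1 = the positive class, e.g. "asleep"):

```python
def binary_metrics(y_true, y_pred):
    """Sensitivity, specificity, precision, accuracy and F1 from a
    binary confusion matrix."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    sens = tp / (tp + fn)   # recall on the positive class
    spec = tn / (tn + fp)   # recall on the negative class
    prec = tp / (tp + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "precision": prec,
        "accuracy": (tp + tn) / len(y_true),
        "f1": 2 * prec * sens / (prec + sens),
    }
```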
**Performance of the binary relevance classifiers**

| Metric | Decision Tree | Logistic Regression | Neural Network | XGBoost |
|---|---|---|---|---|
| **In-bed asleep prediction** | | | | |
| F1 Score | 87.68% | 84.56% | 88.19% | 88.44% |
| Accuracy | 91.06% | 89.78% | 91.50% | 91.71% |
| Sensitivity | 92.04% | 80.90% | 91.70% | 91.69% |
| Precision | 83.72% | 88.57% | 84.94% | 85.42% |
| Specificity | 90.54% | 94.48% | 91.40% | 91.73% |
| **In-bed awake prediction** | | | | |
| F1 Score | 22.22% | 0.00% | 20.15% | 25.48% |
| Accuracy | 69.68% | 93.74% | 94.04% | 93.48% |
| Sensitivity | 69.19% | 0.00% | 12.00% | 17.79% |
| Precision | 13.24% | 0.00% | 62.77% | 44.87% |
| Specificity | 69.71% | 100.00% | 99.52% | 98.54% |
| **Out-of-bed awake prediction** | | | | |
| F1 Score | 94.79% | 93.14% | 94.88% | 94.81% |
| Accuracy | 93.87% | 91.48% | 93.90% | 93.85% |
| Sensitivity | 94.41% | 97.74% | 95.49% | 94.96% |
| Precision | 95.18% | 88.95% | 94.27% | 94.66% |
| Specificity | 93.07% | 82.42% | 91.60% | 92.24% |
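Binary relevance reduces the multilabel problem to one independent binary classifier per label. A minimal sketch with a toy majority-class base learner standing in for the real models (any fit/predict estimator would slot in):

```python
import numpy as np

class MajorityClass:
    """Toy base learner: always predicts the most frequent label."""
    def fit(self, X, y):
        self.label = int(np.bincount(y).argmax())
        return self

    def predict(self, X):
        return np.full(len(X), self.label)

class BinaryRelevance:
    """One independent binary classifier per output label."""
    def __init__(self, make_clf):
        self.make_clf = make_clf

    def fit(self, X, Y):
        self.clfs = [self.make_clf().fit(X, Y[:, j]) for j in range(Y.shape[1])]
        return self

    def predict(self, X):
        return np.column_stack([clf.predict(X) for clf in self.clfs])

# toy data: columns = in-bed asleep / in-bed awake / out-of-bed awake
X = np.zeros((4, 2))
Y = np.array([[1, 0, 1], [1, 0, 1], [1, 1, 0], [0, 0, 1]])
preds = BinaryRelevance(MajorityClass).fit(X, Y).predict(X)
```

Because the per-label classifiers are independent, binary relevance can emit inconsistent label combinations (e.g. "in-bed asleep" and "out-of-bed awake" simultaneously) — one reason the rare in-bed-awake label is hard for this setup.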
https://github.com/esbenlykke/sleep_study